Anger from campaigners as WhatsApp lowers age limit to 13 in UK and EU

Campaigners have reacted with anger to the social media company Meta lowering age restrictions for WhatsApp users to 13 in the UK and EU.

The change to the messaging app, which reduces the age limit from 16 to 13, was announced in February and came into force in the UK and EU on Wednesday.

The campaign group Smartphone Free Childhood said the move by Meta, which also owns Facebook and Instagram, defied growing demands for tech companies to do more to protect children.

The group said: “This flies in the face of the growing national demand for big tech to do more to protect our children.

“Officially allowing anyone over the age of 12 to use their platform (the minimum age was 16 before today) sends a message that it’s safe for children.

“But teachers, parents and experts tell a very different story. As a community we’re fed up with the tech giants putting their shareholder profits before protecting our children.”

WhatsApp said the change brought its age limit into line with that of most other countries, and that protections were in place for younger users.

The director of online safety strategy at Ofcom, the UK communications regulator, said it “won’t hesitate” to fine social media companies that fail to follow its directions, once it has the power to do so.

Mark Bunting told BBC Radio 4’s Today programme that the watchdog was writing codes of practice for enforcing online safety. “So when our powers come into force next year, we’ll be able to hold them to account for the effectiveness of what they’re doing,” he said.

“If they’re not taking those steps at that point, and they can’t demonstrate to us that they’re taking alternative steps which are effective at keeping children safe, then we will be able to investigate.

“We have powers to direct them to make changes, if we believe changes are necessary to make.

“If they don’t comply with those directions, we do have powers to levy fines – and we won’t hesitate to use those powers – if there’s no other way of driving the change that we think is needed.”

Meta this week unveiled a range of safety features designed to protect users, in particular young people, from “sextortion” and intimate image abuse.

It confirmed it would begin testing a filter in direct messages (DMs) on Instagram, called Nudity Protection, which will be switched on by default for users aged under 18 and will automatically blur images detected as containing nudity.

When receiving nude images, users will also see a message urging them not to feel pressure to respond, and an option to block the sender and report the chat.